2 research outputs found

    Optical Axons for Electro-Optical Neural Networks

    Recently, neuromorphic sensors, which convert analogue signals into spiking frequencies, have been reported for neurorobotics. In bio-inspired systems these sensors are connected to the main neural unit, which performs post-processing of the sensor data. The performance of spiking neural networks has been improved using optical synapses, which offer parallel communication between distant neural areas but are sensitive to intensity variations of the optical signal. For systems with several neuromorphic sensors connected optically to the main unit, the use of optical synapses is therefore not an advantage. To address this, in this paper we propose and experimentally verify optical axons whose synapses are activated optically using digital signals. The synaptic weights are encoded by the energy of the stimuli, which are then transmitted optically and independently. We show that optical intensity fluctuations and link misalignment result in a delay in the activation of the synapses. For the proposed optical axon, we have demonstrated line-of-sight transmission over a maximum link length of 190 cm with a delay of 8 μs. Furthermore, we show the axon delay as a function of the illuminance using a fitted model for which the root mean square (RMS) similarity is 0.95.
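
    As an illustration of the fitting step described above, the following Python sketch fits a delay-versus-illuminance curve and scores it with an RMS-style similarity. The measurement arrays, the inverse-power model form and the exact similarity definition are assumptions made for illustration only; the abstract reports only that a fitted model reaches an RMS similarity of 0.95.

        # Sketch: fit axon activation delay as a function of illuminance and
        # score the fit. Data and model form are hypothetical placeholders.
        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical measurements: illuminance (lux) vs. activation delay (us)
        illuminance = np.array([50.0, 100.0, 200.0, 400.0, 800.0])
        delay_us = np.array([42.0, 24.0, 14.0, 9.5, 8.0])

        def delay_model(E, a, b, c):
            """Assumed form: delay falls off with illuminance and saturates at c."""
            return a / (E ** b) + c

        params, _ = curve_fit(delay_model, illuminance, delay_us, p0=(100.0, 0.5, 8.0))
        fitted = delay_model(illuminance, *params)

        # One possible "RMS similarity" style score: 1 - normalised RMS error
        rmse = np.sqrt(np.mean((delay_us - fitted) ** 2))
        similarity = 1.0 - rmse / (delay_us.max() - delay_us.min())
        print(f"fitted parameters: {params}, RMS similarity ~ {similarity:.2f}")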

    The Influence of the Number of Spiking Neurons on Synaptic Plasticity

    The main advantages of spiking neural networks (SNNs) are their high biological plausibility and their fast response due to spiking behaviour. The response time decreases significantly in hardware implementations of SNNs because the neurons operate in parallel. Compared with traditional computational neural networks, SNNs use a lower number of neurons, which also reduces their cost. Another critical characteristic of SNNs is their ability to learn by event association, which is determined mainly by postsynaptic mechanisms such as long-term potentiation (LTP). However, in some conditions, presynaptic plasticity determined by post-tetanic potentiation (PTP) occurs due to the fast activation of presynaptic neurons. This violates the Hebbian learning rules, which are specific to postsynaptic plasticity. Hebbian learning improves the ability of SNNs to discriminate the neural paths trained by the temporal association of events, which is the key element of learning in the brain. This paper quantifies the efficiency of Hebbian learning as the ratio between the LTP and PTP effects on the synaptic weights. On the basis of this new idea, this work evaluates for the first time the influence of the number of neurons on the PTP/LTP ratio and, consequently, on the Hebbian learning efficiency. The evaluation was performed by simulating a neuron model that had previously been tested successfully in control applications. The results show that the firing rate of the postsynaptic neuron (post) depends on the number of presynaptic neurons (pre), which increases the effect of LTP on the synaptic potentiation. When post activates at a requested rate, the learning efficiency varies inversely with the number of pres, reaching its maximum when fewer than two pres are used. In addition, Hebbian learning is more efficient at lower presynaptic firing rates that are divisors of the target frequency of post. This study concludes that, when the electronic neurons model presynaptic plasticity in addition to LTP, the efficiency of Hebbian learning is higher when fewer neurons are used. This result strengthens the observations of our previous research, where an SNN with a reduced number of neurons could successfully learn to control the motion of robotic fingers.
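
    The following Python sketch shows one way an LTP/PTP ratio of the kind described above could be computed from a toy spiking simulation. The spike generation, the LTP and PTP update rules, and all constants are illustrative assumptions; this is not the neuron model used in the paper.

        # Sketch: toy estimate of Hebbian learning efficiency as the ratio
        # between LTP-driven and PTP-driven weight changes (illustrative only).
        import numpy as np

        rng = np.random.default_rng(0)

        def learning_efficiency(n_pre, pre_rate_hz, sim_time_s=1.0, dt=1e-3):
            steps = int(sim_time_s / dt)
            ltp, ptp = 0.0, 0.0
            post_potential = 0.0
            for _ in range(steps):
                # Poisson presynaptic spikes from n_pre neurons
                pre_spikes = rng.random(n_pre) < pre_rate_hz * dt
                post_potential += pre_spikes.sum() * 0.2
                if post_potential >= 1.0:          # postsynaptic spike
                    post_potential = 0.0
                    # LTP: potentiation when presynaptic activity precedes a post spike
                    ltp += 0.01 * pre_spikes.sum()
                # PTP: presynaptic potentiation driven by presynaptic firing alone
                ptp += 0.001 * pre_spikes.sum()
            return ltp / ptp if ptp > 0 else float("inf")

        for n in (1, 2, 4, 8):
            print(n, "presynaptic neurons -> LTP/PTP ratio",
                  round(learning_efficiency(n, 20.0), 2))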